Recent #Memory Technology news in the semiconductor industry

11 months ago
1. Micron Technology introduces the MRDIMM, a server memory designed to enhance bandwidth and reduce latency compared to existing RDIMMs; 2. The MRDIMM is compatible with Xeon 6 processors and is already available for shipping, with mass production planned for the second half of 2024; 3. It is designed for AI inference and HPC applications, offering improved thermal design and energy efficiency for memory-intensive workloads.
Memory Technology · Micron Technology · Server Hardware
2 months ago

Jülich researchers have presented novel memristive components in Nature Communications that offer significant advantages over previous versions. These memristors are more robust, operate across a wider voltage range, and can be used in both analog and digital modes. They could address 'catastrophic forgetting' in artificial neural networks, in which previously learned information is abruptly lost.

The researchers have implemented the new memristive element in an artificial neural network model, achieving high pattern-recognition accuracy. They now plan to search for further memristor materials that may outperform the current version.
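
As background on how such a synaptic element behaves, here is a minimal sketch of the classic linear-ion-drift memristor model: generic textbook behavior, not the Jülich device, and every parameter value is an assumption.

```python
# Minimal linear-ion-drift memristor model (after Strukov et al.) -- an
# illustrative sketch, NOT the Julich device; all parameters are assumed.
def step(w, i, dt, mu_v=1e-14, r_on=100.0, d=1e-8):
    """Advance the normalized state w in [0, 1] under current i for dt seconds."""
    w += mu_v * r_on / d**2 * i * dt   # dw/dt = mu_v * R_on / D^2 * i(t)
    return min(max(w, 0.0), 1.0)       # state is bounded by the device thickness

def resistance(w, r_on=100.0, r_off=16e3):
    """Effective resistance interpolates between R_on (w=1) and R_off (w=0)."""
    return r_on * w + r_off * (1.0 - w)

w = 0.1
for _ in range(50):                    # positive current pulses raise w ...
    w = step(w, i=1e-3, dt=1e-3)
print(resistance(w))                   # ... so the resistance falls
```

The bounded analog state is, loosely, what is exploited when memristors serve as gradually adjustable synaptic weights rather than binary switches.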

Electrochemistry · Memory Technology · artificial intelligence · material science
6 months ago

➀ The CXL 3.2 specification has been officially released by the CXL consortium, focusing on optimizing the monitoring and management of CXL memory devices and enhancing their functionality for operating systems and applications.

➁ The specification also extends security through Trusted Security Protocol (TSP) and ensures full backward compatibility with previous CXL specifications.

➂ CXL targets high-performance data-center servers: it lets processor modules share memory, and it provides low-latency interconnect paths both for memory access and for communication between host processors and devices that need shared memory resources.
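
In practice, on Linux such CXL-attached memory typically surfaces as a memory-only (CPU-less) NUMA node. A hedged sketch of spotting those nodes; the sysfs scan and all names here are illustrative assumptions, not part of the CXL specification.

```python
import os

def memory_only_nodes(node_cpus):
    """Nodes with memory but no CPUs -- on Linux these are often CXL- or
    otherwise fabric-attached memory expanders."""
    return sorted(n for n, cpus in node_cpus.items() if not cpus)

def read_node_topology(root="/sys/devices/system/node"):
    """Best-effort scan of the Linux NUMA topology (Linux-only, illustrative)."""
    topo = {}
    for entry in os.listdir(root):
        if entry.startswith("node") and entry[4:].isdigit():
            with open(os.path.join(root, entry, "cpulist")) as f:
                topo[int(entry[4:])] = f.read().strip()  # "" => no CPUs
    return topo

# Example topology: node 0 holds the CPUs, node 1 is a CPU-less memory node.
print(memory_only_nodes({0: "0-15", 1: ""}))  # -> [1]
```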

CXL · Memory Technology
6 months ago

➀ The chip industry is striving to push 3D NAND flash stack heights from around 200 layers today to 800 layers or more in the coming years, to meet relentless demand for all types of memory.

➁ The additional layers will bring a series of incremental reliability challenges, though the NAND flash industry has been steadily raising stack heights for the past decade.

➂ The 3D NAND roadmap now points from roughly 500 toward 1,000 layers. Reaching that many layers, however, is not simply a matter of doing more of what has been done so far.
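
One route to such counts is string stacking: fabricating decks of a manageable layer count and building them on top of each other. Purely illustrative arithmetic — the deck sizes below are assumptions, not any vendor's roadmap.

```python
def total_layers(layers_per_deck, decks):
    """String stacking: total stack height = decks x layers per deck."""
    return layers_per_deck * decks

print(total_layers(200, 4))  # four ~200-layer decks -> 800 layers
print(total_layers(250, 4))  # -> 1000 layers
```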

3D NAND · Memory Technology · NAND flash
7 months ago

➀ Stanford University is researching a hybrid memory that combines the density of DRAM with the speed of SRAM, funded by the CHIPS and Science Act.

➁ The research is part of the California Pacific Northwest AI Hardware Center project, which will receive $16.3 million from the US Department of Defense.

➂ The team, led by H.S. Philip Wong, focuses on developing more energy-efficient hardware for AI, with memory being the core.

➃ The hybrid gain cell memory combines the small footprint of DRAM with speeds approaching those of SRAM.

➄ The gain cell resembles a DRAM cell but replaces the capacitor with a second transistor, storing the data as charge on that transistor's gate.

➅ Reads are non-destructive in the gain cell, and the read transistor amplifies the signal from the storage transistor during readout.

➆ Liu and Wong's hybrid gain cell memory, which pairs silicon read transistors with indium tin oxide write transistors, overcomes the short retention of earlier gain cells and achieves a data retention time of over 5,000 seconds.

➇ These hybrid storage cells can be integrated into logic chips, potentially changing the way memory is used in computers.
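
To make the retention figure concrete, a toy model: the charge written onto the storage gate leaks away roughly exponentially, and retention ends when the level crosses the read threshold. The voltages and leakage time constant below are assumptions chosen only to illustrate the >5,000-second figure from the article.

```python
import math

def stored_voltage(v0, t, tau):
    """Stored level after t seconds, assuming simple exponential gate leakage."""
    return v0 * math.exp(-t / tau)

def retention_time(v0, v_min, tau):
    """Time until the stored level decays from v0 to the read threshold v_min."""
    return tau * math.log(v0 / v_min)

# Assumed: 1.0 V written level, 0.5 V read threshold, ~7,500 s leakage constant.
tau = 7500.0
print(retention_time(1.0, 0.5, tau))   # ~5,199 s -- past the 5,000 s mark
```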

AI Hardware · Memory Technology · energy efficiency
8 months ago
➀ HBM4 is the key to advancing AI by providing high capacity and performance for large-scale data-intensive applications; ➁ HBM4 improves AI and ML performance through increased bandwidth and memory density, reducing bottlenecks and improving system performance; ➂ HBM4 is designed with energy efficiency in mind, achieving better performance per watt and is crucial for the sustainability of large-scale AI deployments; ➃ HBM4's scalability allows for growth without becoming too expensive or inefficient, making it crucial for deploying AI in various applications.
AI · AI Performance · HBM4 · Memory Technology · energy efficiency
8 months ago
➀ SK hynix has started mass production of 12-Hi HBM3E memory stacks, setting the stage for next-generation AI and HPC processors; ➁ The new modules offer a peak bandwidth of 1.22 TB/s per module and a total of 9.83 TB/s with eight stacks; ➂ SK hynix is the first company to mass produce 12-Hi HBM3E memory, with plans to ship by the end of the year for AMD's Instinct MI325X and Nvidia's Blackwell Ultra.
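
The two bandwidth figures are consistent with HBM3E's 1024-bit per-stack interface at the widely quoted 9.6 GT/s pin rate; a quick arithmetic check:

```python
def hbm_bandwidth_gbs(pin_rate_gts, interface_bits):
    """Peak bandwidth in GB/s: pin rate (GT/s) x interface width (bits) / 8."""
    return pin_rate_gts * interface_bits / 8

per_stack = hbm_bandwidth_gbs(9.6, 1024)  # HBM3E stacks use a 1024-bit interface
print(per_stack)        # 1228.8 GB/s, i.e. ~1.22 TB/s per stack
print(per_stack * 8)    # 9830.4 GB/s, i.e. ~9.83 TB/s across eight stacks
```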
AI Processors · HBM3e · Mass Production · Memory Technology · SK Hynix
9 months ago
➀ SK Hynix aims to develop differentiated HBM products to meet specific customer needs in the AI sector. ➁ The company's Advanced Package Development leader emphasizes the need for flexible and scalable technology in AI memory development. ➂ HBM memory has evolved rapidly, with HBM3E reaching 9.2 GT/s - 10 GT/s and HBM4 set to feature a 2048-bit interface. ➃ SK Hynix plans to offer customized or semi-customized HBM4 solutions, considering various customer preferences and technologies.
AI · HBM · Memory Technology
12 months ago
1. The article discusses the ongoing debate at the International Solid-State Circuits Conference (ISSCC) over the successors to NAND memory, which remains the preferred choice for non-volatile memory chips. 2. It highlights various approaches by industry leaders like Samsung, Hitachi, Infineon, and Motorola to develop new memory technologies, including chalcogenide-based memory, NanoCrystal memory, and MRAM. 3. Despite 20 years of research and development, no clear successor to NAND has emerged, indicating the complexity and challenges in advancing memory technology.
ISSCC · Memory Technology · NAND memory